For images, use the requestImageDataAndOrientation(for:options:resultHandler:) method to obtain the photo’s original data, which includes the HDR information. You can then decode that data with CGImageSource, UIImageReader (with prefersHighDynamicRange set), or CIImage to obtain a UIImage that displays correctly on HDR screens.
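As a minimal sketch of that flow (iOS 17+), assuming you already have a PHAsset and a completion handler of your own naming:

```swift
import Photos
import UIKit

// Sketch: fetch the asset's original data, then decode it with UIImageReader
// so the resulting UIImage keeps its HDR content.
// `asset` and the completion signature are assumptions for illustration.
func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // allow iCloud download if needed

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else {
            completion(nil)
            return
        }
        // UIImageReader decodes the HDR variant when configured to prefer it.
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}
```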
For live photos, it’s more complicated. The PHLivePhoto object returned by requestLivePhoto(for:targetSize:contentMode:options:resultHandler:) does not contain an HDR image. You can obtain the HDR version of the still with requestImageDataAndOrientation(for:options:resultHandler:), as above, but getting a PHLivePhotoView to display it is not as simple. There is private API that makes it work, which I’ll post below; it’s up to you to decide whether that risk is acceptable. All private API usage can be hidden, if App Store submission is the only thing you fear.
fileprivate static func fudgeLiveView(_ liveView: PHLivePhotoView, with image: UIImage?) {
    // Private API: replace the view's poster frame with the HDR still.
    if let image {
        liveView.setValue(image, forKeyPath: "playerView.overrideImage")
    }
    // Private API: reach into the internal player view's subview hierarchy…
    guard let view = liveView.value(forKeyPath: "playerView.subviews.@firstObject") as? UIView else {
        return
    }
    // …and opt its image views into HDR rendering.
    for subview in view.subviews {
        if let imageView = subview as? UIImageView {
            imageView.preferredImageDynamicRange = .high
        }
    }
}
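Putting the pieces together, a hedged usage sketch might look like this: request the Live Photo for playback, separately request the HDR still, and apply the workaround. The function name, `liveView`, and the UIImageReader decoding step are my assumptions for illustration, not part of the workaround itself.

```swift
import Photos
import PhotosUI
import UIKit

// Sketch: show a Live Photo with an HDR poster frame in an existing
// PHLivePhotoView. Assumes `fudgeLiveView(_:with:)` from above is defined
// on the same type; all other names here are illustrative.
static func displayHDRLivePhoto(for asset: PHAsset, in liveView: PHLivePhotoView) {
    let liveOptions = PHLivePhotoRequestOptions()
    liveOptions.isNetworkAccessAllowed = true

    PHImageManager.default().requestLivePhoto(for: asset,
                                              targetSize: PHImageManagerMaximumSize,
                                              contentMode: .aspectFit,
                                              options: liveOptions) { livePhoto, _ in
        liveView.livePhoto = livePhoto

        // Fetch the HDR still separately, since the PHLivePhoto lacks it.
        let dataOptions = PHImageRequestOptions()
        dataOptions.isNetworkAccessAllowed = true
        PHImageManager.default().requestImageDataAndOrientation(for: asset, options: dataOptions) { data, _, _, _ in
            guard let data else { return }
            var config = UIImageReader.Configuration()
            config.prefersHighDynamicRange = true
            let hdrImage = UIImageReader(configuration: config).image(data: data)
            fudgeLiveView(liveView, with: hdrImage)
        }
    }
}
```

Note that both result handlers may be called more than once (e.g. with a degraded image first); in production you would check the info dictionary before applying the override.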